490 research outputs found

    Adoptive Transfer of Myeloid-Derived Suppressor Cells and T Cells in a Prostate Cancer Model

    The adoptive transfer of immune cells for cancer, chronic infection, and autoimmunity is an emerging field that has shown promise in recent trials. The transgenic adenocarcinoma of the mouse prostate (TRAMP) model is a classical mouse model of prostate cancer (PCa), and TRAMP cell lines were derived from a TRAMP mouse tumor. TRAMP-C2 is tumorigenic when subcutaneously (s.c.) grafted into syngeneic C57BL/6 host mice (Foster et al., 1997). This protocol describes the adoptive transfer of purified CD11b(+)Gr1(+) double-positive (DP) myeloid-derived suppressor cells (MDSC) and CD3(+) T cells in the TRAMP-C2 prostate cancer mouse model, in order to establish the intrinsic functionality of these immune cells and to determine their role in tumorigenesis in vivo (Yan et al., 2014).

    Hypoxic conditions differentially regulate TAZ and YAP in cancer cells

    The Hippo-YAP pathway is altered and implicated as an oncogenic signaling pathway in many human cancers. Hypoxia is an important microenvironmental factor that promotes tumorigenesis. However, the effects of hypoxia on the two most important Hippo-YAP effectors, YAP (Yes-associated protein) and TAZ (transcriptional co-activator with PDZ-binding motif), have not been reported. In this work, we demonstrated that TAZ was functionally involved in cell proliferation and/or migration in epithelial ovarian cancer (EOC) or human ovarian surface epithelial (HOSE) cells. Hypoxic conditions (1% O2 or hypoxia mimics) induced a reduction of YAP phosphorylation (S127) and of total YAP expression in the EOC cell lines OVCAR5 and SKOV3. However, these conditions up-regulated levels of S69-phosphorylated TAZ in EOC cells. The known TAZ kinases, Lats1 and Akt, were unlikely to be involved in this up-regulation of pTAZ under hypoxic conditions. Together, our data reveal new and differential regulatory mechanisms of TAZ and YAP in cancer cells under hypoxic conditions.

    Folded Polynomial Codes for Coded Distributed AA^⊤-Type Matrix Multiplication

    In this paper, motivated by its value in practical applications, we consider the coded distributed matrix multiplication problem of computing AA^⊤ in a distributed computing system with N worker nodes and a master node, where the input matrices A and A^⊤ are partitioned into p-by-m and m-by-p blocks of equal-size sub-matrices, respectively. For effective straggler mitigation, we propose a novel computation strategy, named the folded polynomial code, which is obtained by modifying entangled polynomial codes. Moreover, we characterize a lower bound on the optimal recovery threshold among all linear computation strategies when the underlying field is the real numbers, and our folded polynomial codes achieve this bound in the case m = 1. Compared with all known computation strategies for coded distributed matrix multiplication, our folded polynomial codes outperform them in terms of recovery threshold, download cost, and decoding complexity. (Comment: 14 pages, 2 tables)
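    The block structure this abstract relies on can be illustrated with a small uncoded sketch (the polynomial-coding and straggler-mitigation layer is omitted; the grid and block sizes below are hypothetical): each (i, j) block of AA^⊤ is a sum over the shared block index k of A_ik · A_jk^⊤, which is exactly the set of sub-products a master would distribute to workers.

    ```python
    import numpy as np

    # Hypothetical sizes: a p x m block grid of b x b sub-matrices.
    p, m, b = 2, 3, 4
    A = np.random.default_rng(0).standard_normal((p * b, m * b))

    # Partition A into its p x m grid of blocks.
    blocks = [[A[i*b:(i+1)*b, k*b:(k+1)*b] for k in range(m)]
              for i in range(p)]

    # Uncoded "worker" computations: one block product per (i, j, k) triple,
    # accumulated into the (i, j) block of the result.
    result = np.zeros((p * b, p * b))
    for i in range(p):
        for j in range(p):
            for k in range(m):
                result[i*b:(i+1)*b, j*b:(j+1)*b] += blocks[i][k] @ blocks[j][k].T

    assert np.allclose(result, A @ A.T)
    ```

    The coded strategy in the paper replaces the one-sub-product-per-worker assignment above with evaluations of carefully constructed polynomials, so that the master can decode AA^⊤ from any sufficiently large subset of worker results.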

    Impaired plasma lipid profiles in acute hepatitis

    The present study examined plasma lipid profiles in thirty patients suffering from acute viral hepatitis. Patients' blood samples were collected at both the onset and the recovery of disease. Thirty sex- and age-matched normal subjects were included as controls. Plasma total triglycerides (TG), total cholesterol, high-density lipoprotein cholesterol (HDL-C), low-density lipoprotein cholesterol (LDL-C), apolipoprotein AI (ApoAI), apolipoprotein B (ApoB), lipoprotein (a) (Lp(a)), blood coagulation status including prothrombin complex activity and activated partial thromboplastin time (APTT), and hepatic functions were determined with an automatic biochemical analyzer. Plasma levels of total cholesterol, HDL-C and ApoAI were significantly lower in the patients at the acute phase of hepatitis than in normal subjects, whereas plasma levels of TG and LDL-C were markedly higher in the patients than in normal subjects (P < 0.05). Moreover, patients' plasma levels of total cholesterol, LDL-C, HDL-C and ApoAI were lower at the active phase of the disease than at the recovery phase, indicating that acute liver damage can significantly influence lipid metabolism in vivo. No pathological changes of blood coagulation status occurred in these patients during the study, as all selected patients had moderate hepatitis. We conclude that examination of the plasma lipid profile could be considered a clinical index reflecting liver damage in the active phase of hepatitis.

    B → K K* Decays in the Perturbative QCD Approach

    We calculate the branching ratios and CP-violating asymmetries for B^0 → K^0 K̄^{*0}, K̄^0 K^{*0}, K^+ K^{*-}, K^- K^{*+}, and B^+ → K^+ K̄^{*0} and K̄^0 K^{*+} decays by employing the low-energy effective Hamiltonian and the perturbative QCD (pQCD) factorization approach. The theoretical predictions for the branching ratios are Br(B^0/B̄^0 → K^± K^{*∓}) ≈ 7.4 × 10^{-8}, Br(B^0/B̄^0 → K^0 K̄^{*0} (K̄^0 K^{*0})) ≈ 19.6 × 10^{-7}, Br(B^+ → K^+ K̄^{*0}) ≈ 3 × 10^{-7} and Br(B^+ → K^{*+} K̄^0) ≈ 18.3 × 10^{-7}, which are consistent with currently available experimental upper limits. We also predict large CP-violating asymmetries in these decays: A_CP^dir(K^± K̄^{*0}) ≈ -20%, A_CP^dir(K^{*±} K̄^0) ≈ -49%, which can be tested by the forthcoming B meson experiments. (Comment: 25 pages, 7 figures, RevTeX; some corrections to the numerical results and contents, typos removed, new references added)

    Improving Few-shot and Zero-shot Entity Linking with Coarse-to-Fine Lexicon-based Retriever

    Few-shot and zero-shot entity linking focus on tail and emerging entities, which are more challenging but closer to real-world scenarios. The mainstream method is the "retrieve and rerank" two-stage framework. In this paper, we propose a coarse-to-fine lexicon-based retriever that retrieves entity candidates effectively in two layers. The first layer retrieves coarse-grained candidates by leveraging entity names, while the second layer narrows the search to fine-grained candidates within the coarse-grained ones. In addition, the second layer utilizes entity descriptions to effectively disambiguate tail or new entities that share names with existing popular entities. Experimental results indicate that our approach obtains superior performance without requiring extensive finetuning in the retrieval stage. Notably, our approach ranked 1st in NLPCC 2023 Shared Task 6 on Chinese Few-shot and Zero-shot Entity Linking. (Comment: Accepted to NLPCC 2023)
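    The two-layer flow described above can be sketched minimally as follows. This is an illustrative stand-in, not the paper's implementation: the toy knowledge base, the substring name match, and the token-overlap description score are all assumptions made for the example.

    ```python
    # Toy knowledge base with same-name entities that need disambiguation.
    kb = [
        {"name": "Mercury", "desc": "smallest planet in the solar system"},
        {"name": "Mercury", "desc": "chemical element with symbol Hg"},
        {"name": "Mercury Records", "desc": "American record label"},
    ]

    def coarse_layer(mention, entries):
        # Layer 1: cheap name matching to collect coarse-grained candidates.
        return [e for e in entries if mention.lower() in e["name"].lower()]

    def fine_layer(context, candidates):
        # Layer 2: rank coarse candidates by lexical overlap between the
        # mention context and entity descriptions, so that same-name tail
        # entities can still win on descriptive evidence.
        ctx = set(context.lower().split())
        return max(candidates,
                   key=lambda e: len(ctx & set(e["desc"].lower().split())))

    best = fine_layer("the element Hg is liquid", coarse_layer("Mercury", kb))
    print(best["desc"])  # → "chemical element with symbol Hg"
    ```

    The design point the abstract makes is visible here: name matching alone cannot separate the three "Mercury" entries, while the description layer resolves the mention using context that only the correct entity's description shares.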

    E2Net: Resource-Efficient Continual Learning with Elastic Expansion Network

    Continual learning methods are designed to learn new tasks without erasing previous knowledge. However, continual learning often requires massive computational power and storage capacity for satisfactory performance. In this paper, we propose a resource-efficient continual learning method called the Elastic Expansion Network (E2Net). Leveraging core subnet distillation and precise replay sample selection, E2Net achieves superior average accuracy and diminished forgetting within the same computational and storage constraints, all while minimizing processing time. In E2Net, we propose Representative Network Distillation to identify the representative core subnet by assessing parameter quantity and output similarity with the working network, distilling analogous subnets within the working network to mitigate reliance on rehearsal buffers and facilitate knowledge transfer across previous tasks. To enhance storage resource utilization, we then propose Subnet Constraint Experience Replay, which optimizes rehearsal efficiency through a sample storage strategy based on the structures of representative networks. Extensive experiments, conducted predominantly in cloud environments with diverse datasets and also spanning the edge environment, demonstrate that E2Net consistently outperforms state-of-the-art methods. In addition, our method outperforms competitors in terms of both storage and computational requirements.

    A Two-Stage Framework with Self-Supervised Distillation For Cross-Domain Text Classification

    Cross-domain text classification aims to adapt models to a target domain that lacks labeled data. It leverages or reuses rich labeled data from different but related source domain(s) and unlabeled data from the target domain. Previous work focuses on extracting either domain-invariant features or task-agnostic features, ignoring domain-aware features that may be present in the target domain and could be useful for the downstream task. In this paper, we propose a two-stage framework for cross-domain text classification. In the first stage, we fine-tune the model with masked language modeling (MLM) and labeled data from the source domain. In the second stage, we further fine-tune the model with self-supervised distillation (SSD) and unlabeled data from the target domain. We evaluate its performance on a public cross-domain text classification benchmark, and the experimental results show that our method achieves new state-of-the-art results for both single-source domain adaptation (94.17%, ↑1.03%) and multi-source domain adaptation (95.09%, ↑1.34%).
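    A distillation stage like the SSD step above typically minimizes a divergence between a teacher's softened predictions and the student's, computed on unlabeled inputs. The sketch below shows that generic objective in NumPy; the KL form and the temperature value are assumptions for illustration, not the paper's exact recipe.

    ```python
    import numpy as np

    def softmax(z, t=1.0):
        # Temperature-softened softmax over a logit vector.
        z = np.asarray(z, dtype=float) / t
        z -= z.max()  # numerical stability
        e = np.exp(z)
        return e / e.sum()

    def distill_loss(teacher_logits, student_logits, t=2.0):
        # KL(teacher || student) on temperature-softened distributions:
        # zero when the student matches the teacher, positive otherwise.
        p = softmax(teacher_logits, t)
        q = softmax(student_logits, t)
        return float(np.sum(p * (np.log(p) - np.log(q))))

    # Identical logits give zero loss; diverging logits increase it.
    print(distill_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # → 0.0
    ```

    On unlabeled target-domain text, a loss of this shape lets the model learn from its own (or a frozen teacher's) soft predictions without any labels, which is what makes the second stage usable in the label-free target domain.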